Nonconvex optimization using negative curvature within a modified linesearch

Authors

Abstract


Similar Articles

Nonconvex optimization using negative curvature within a modified linesearch

This paper describes a new algorithm for the solution of nonconvex unconstrained optimization problems, with the property of converging to points satisfying second-order necessary optimality conditions. The algorithm is based on a procedure which, from two descent directions, a Newton-type direction and a direction of negative curvature, selects in each iteration the linesearch model best adapt...
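The core idea described above — combining a Newton-type direction with a negative curvature direction inside a single linesearch — can be illustrated with a minimal curvilinear-search sketch. This is not the paper's algorithm: the test function, the eigenvalue-based Hessian modification, and the Armijo-style acceptance rule are all illustrative assumptions.

```python
import numpy as np

def f(x):
    # Illustrative nonconvex function: saddle at the origin,
    # minimizers at (+-1/sqrt(2), 0)
    return x[0]**4 - x[0]**2 + x[1]**2

def grad(x):
    return np.array([4*x[0]**3 - 2*x[0], 2*x[1]])

def hess(x):
    return np.array([[12*x[0]**2 - 2, 0.0], [0.0, 2.0]])

def step(x, tol=1e-8):
    g, H = grad(x), hess(x)
    w, V = np.linalg.eigh(H)
    # Negative curvature direction: eigenvector of the most negative
    # eigenvalue, oriented so it is not an ascent direction
    d_c = np.zeros_like(x)
    if w[0] < -tol:
        d_c = V[:, 0]
        if g @ d_c > 0:
            d_c = -d_c
    # Newton-type direction from a positive-definite Hessian modification
    H_pd = V @ np.diag(np.maximum(np.abs(w), tol)) @ V.T
    d_n = -np.linalg.solve(H_pd, g)
    # Curvilinear backtracking on x(a) = x + a^2*d_n + a*d_c; the forcing
    # term combines the gradient slope with the curvature along d_c
    slope = g @ d_n + min(0.0, 0.5 * (d_c @ H @ d_c))
    a, fx = 1.0, f(x)
    while f(x + a*a*d_n + a*d_c) > fx + 1e-4 * a*a * slope and a > 1e-12:
        a *= 0.5
    return x + a*a*d_n + a*d_c

x = np.array([0.0, 0.5])   # first coordinate starts where the gradient is flat
for _ in range(30):
    x = step(x)
```

Starting with the first coordinate at zero, the gradient carries no information along that axis; it is the negative curvature direction that moves the iterate off the saddle ridge, which is why such methods can reach points satisfying second-order necessary conditions.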


Exploiting Negative Curvature Directions in Linesearch Methods for Unconstrained Optimization

In this paper we consider the definition of new efficient linesearch algorithms for solving large scale unconstrained optimization problems which exploit the local nonconvexity of the objective function. Existing algorithms of this class compute, at each iteration, two search directions: a Newton-type direction which ensures a global and fast convergence, and a negative curvature direction which e...


Data and performance profiles applying an adaptive truncation criterion, within linesearch-based truncated Newton methods, in large scale nonconvex optimization

In this paper, we report data and experiments related to the research article entitled "An adaptive truncation criterion, for linesearch-based truncated Newton methods in large scale nonconvex optimization" by Caliciotti et al. [1]. In particular, in Caliciotti et al. [1], large scale unconstrained optimization problems are considered by applying linesearch-based truncated Newton methods. In th...


A Modified Regularized Newton Method for Unconstrained Nonconvex Optimization

In this paper, we present a modified regularized Newton method for unconstrained nonconvex optimization using a trust region technique. We show that if the gradient and Hessian of the objective function are Lipschitz continuous, then the modified regularized Newton method (M-RNM) has a global convergence property. Numerical results show that the algorithm is very efficient.
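The regularized Newton step described in this abstract can be sketched as follows. This is an illustration, not the M-RNM of the paper: the update rule for the regularization parameter, the acceptance-ratio threshold, and the test problem are all illustrative assumptions.

```python
import numpy as np

def regularized_newton(f, grad, hess, x, mu=1.0, iters=50):
    for _ in range(iters):
        g, H = grad(x), hess(x)
        if np.linalg.norm(g) < 1e-6:
            break
        # Shift the Hessian so the linear system is positive definite
        # even where H is indefinite
        sigma = mu + max(0.0, -np.linalg.eigvalsh(H)[0])
        d = np.linalg.solve(H + sigma * np.eye(len(x)), -g)
        pred = -(g @ d + 0.5 * d @ H @ d)          # model reduction
        ared = f(x) - f(x + d)                     # actual reduction
        rho = ared / pred if pred > 0 else -1.0
        if rho > 0.1:
            x, mu = x + d, max(1e-8, mu / 2)       # accept, relax mu
        else:
            mu *= 4                                # reject, tighten mu
    return x

# Usage on a small nonconvex problem (minimizers at (+-1/sqrt(2), 0))
f = lambda x: x[0]**4 - x[0]**2 + x[1]**2
g = lambda x: np.array([4*x[0]**3 - 2*x[0], 2*x[1]])
h = lambda x: np.array([[12*x[0]**2 - 2, 0.0], [0.0, 2.0]])
x_star = regularized_newton(f, g, h, np.array([2.0, 1.0]))
```

The regularization parameter plays the role of an inverse trust-region radius: a rejected step tightens it (shorter, more gradient-like steps), while accepted steps relax it toward a pure Newton iteration near the solution.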


A Linesearch Algorithm with Memory for Unconstrained Optimization

This paper considers algorithms for unconstrained nonlinear optimization where the model used by the algorithm to represent the objective function explicitly includes memory of the past iterations. This is intended to make the algorithm less "myopic" in the sense that its behaviour is not completely dominated by the local nature of the objective function, but rather by a more global view. We pr...



Journal

Journal title: European Journal of Operational Research

Year: 2008

ISSN: 0377-2217

DOI: 10.1016/j.ejor.2006.09.097